High-dimensional variable selection via low-dimensional adaptive learning
Abstract
A stochastic search method, the so-called Adaptive Subspace (AdaSub) method, is proposed for variable selection in high-dimensional linear regression models. The method aims at finding the best model with respect to a certain model selection criterion and is based on the idea of adaptively solving low-dimensional sub-problems in order to provide a solution to the original high-dimensional problem. Any of the usual $\ell _{0}$-type model selection criteria can be used, such as Akaike’s Information Criterion (AIC), the Bayesian Information Criterion (BIC) or the Extended BIC (EBIC), the last being particularly suitable for high-dimensional cases. The limiting properties of the new algorithm are analysed and it is shown that, under certain conditions, AdaSub converges to the best model according to the considered criterion. In a simulation study, the performance of AdaSub is investigated in comparison with alternative methods. The effectiveness of the proposed method is illustrated via various simulated datasets and a real data example.
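The core idea above — repeatedly drawing a small random subspace of predictors, solving the low-dimensional best-subset problem exactly by a BIC-type criterion, and adaptively increasing the sampling probability of variables that keep being selected — can be sketched as follows. This is a minimal illustrative sketch under stated assumptions, not the authors' exact algorithm: the function names (`bic`, `adasub`), the subspace-size cap, and the specific adaptation rule (selection frequency clipped into [0.01, 0.99]) are all assumptions made here for exposition.

```python
import numpy as np
from itertools import combinations

def bic(X, y, subset):
    """BIC of an OLS fit on the given predictor subset (Gaussian errors assumed)."""
    n = len(y)
    cols = [np.ones(n)] + ([X[:, list(subset)]] if len(subset) else [])
    Xs = np.column_stack(cols)
    beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    rss = np.sum((y - Xs @ beta) ** 2)
    return n * np.log(rss / n) + Xs.shape[1] * np.log(n)

def adasub(X, y, q=5, n_iter=50, seed=0):
    """Illustrative adaptive-subspace search (a sketch, not the published AdaSub):
    sample a low-dimensional subspace, solve best-subset by BIC inside it, and
    adapt each variable's sampling probability to its selection frequency."""
    rng = np.random.default_rng(seed)
    p = X.shape[1]
    probs = np.full(p, q / p)                 # initial inclusion probabilities
    sampled = np.zeros(p)                     # how often each variable was sampled
    chosen = np.zeros(p)                      # how often each variable was selected
    best_subset, best_score = (), bic(X, y, ())
    for _ in range(n_iter):
        cand = np.where(rng.random(p) < probs)[0]      # draw a random subspace
        if len(cand) > 10:                             # keep the sub-problem small
            cand = rng.choice(cand, size=10, replace=False)
        sampled[cand] += 1
        # exhaustive best-subset search inside the low-dimensional subspace
        sub_best, sub_score = (), bic(X, y, ())
        for r in range(1, len(cand) + 1):
            for s in combinations(cand, r):
                sc = bic(X, y, s)
                if sc < sub_score:
                    sub_best, sub_score = s, sc
        chosen[list(sub_best)] += 1
        if sub_score < best_score:
            best_subset, best_score = sub_best, sub_score
        # adaptation: selection frequency drives future sampling probabilities
        freq = np.where(sampled > 0, chosen / np.maximum(sampled, 1), q / p)
        probs = np.clip(freq, 0.01, 0.99)
    return sorted(int(j) for j in best_subset), best_score
```

On well-separated simulated data, variables that genuinely reduce the criterion are sampled ever more often, so the low-dimensional searches quickly concentrate on the relevant subspace.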
Similar resources
High Dimensional Variable Selection.
This paper explores the following question: what kind of statistical guarantees can be given when doing variable selection in high dimensional models? In particular, we look at the error rates and power of some multi-stage regression methods. In the first stage we fit a set of candidate models. In the second stage we select one model by cross-validation. In the third stage we use hypothesis tes...
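The multi-stage recipe described in that snippet — fit candidate models, then choose among them by cross-validation — can be sketched as below. This is a hypothetical illustration, not that paper's procedure: the candidate models here are nested sets built from a marginal-covariance ranking, and the names (`two_stage_select`, `max_size`) are assumptions of this sketch.

```python
import numpy as np

def two_stage_select(X, y, k=5, max_size=10, seed=0):
    """Stage 1: rank predictors by absolute marginal covariance with y and form
    nested candidate models. Stage 2: pick the model size with the smallest
    k-fold cross-validated prediction error. (Illustrative sketch only.)"""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    Xc, yc = X - X.mean(0), y - y.mean()
    order = np.argsort(-np.abs(Xc.T @ yc))     # stage 1: marginal ranking
    folds = rng.permutation(n) % k             # random fold assignment

    def cv_error(subset):
        err = 0.0
        for f in range(k):
            tr, te = folds != f, folds == f
            Xtr = np.column_stack([np.ones(tr.sum()), X[np.ix_(tr, subset)]])
            beta, *_ = np.linalg.lstsq(Xtr, y[tr], rcond=None)
            Xte = np.column_stack([np.ones(te.sum()), X[np.ix_(te, subset)]])
            err += np.sum((y[te] - Xte @ beta) ** 2)
        return err / n

    errors = {m: cv_error(list(order[:m])) for m in range(1, min(p, max_size) + 1)}
    best_m = min(errors, key=errors.get)       # stage 2: choose size by CV
    return sorted(int(j) for j in order[:best_m]), errors[best_m]
```

A third stage of formal inference (hypothesis testing on the selected model) would then operate on held-out data, since testing on the same data used for selection invalidates the usual error rates.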
Consistent high-dimensional Bayesian variable selection via penalized credible regions.
For high-dimensional data, particularly when the number of predictors greatly exceeds the sample size, selection of relevant predictors for regression is a challenging problem. Methods such as sure screening, forward selection, or penalized regressions are commonly used. Bayesian variable selection methods place prior distributions on the parameters along with a prior over model space, or equiv...
High-Dimensional Non-Linear Variable Selection through Hierarchical Kernel Learning
We consider the problem of high-dimensional non-linear variable selection for supervised learning. Our approach is based on performing linear selection among exponentially many appropriately defined positive definite kernels that characterize non-linear interactions between the original variables. To select efficiently from these many kernels, we use the natural hierarchical structure of the pr...
High Dimensional Variable Selection with Error Control
Background. The iterative sure independence screening (ISIS) is a popular method for selecting important variables while maintaining most of the informative variables relevant to the outcome in high-throughput data. However, it is not only computationally intensive but may also cause a high false discovery rate (FDR). We propose to use the FDR as a screening method to reduce the high dimension to ...
Variable Selection for High Dimensional Multivariate Outcomes.
We consider variable selection for high-dimensional multivariate regression using penalized likelihoods when the number of outcomes and the number of covariates might be large. To account for within-subject correlation, we consider variable selection when a working precision matrix is used and when the precision matrix is jointly estimated using a two-stage procedure. We show that under suitabl...
Journal
Journal title: Electronic Journal of Statistics
Year: 2021
ISSN: 1935-7524
DOI: https://doi.org/10.1214/21-ejs1797